COURSE INTRODUCTION AND APPLICATION INFORMATION


Course Name: Introduction to Neural Networks
Code: CE 470
Semester: Fall/Spring
Theory (hour/week): 3
Application/Lab (hour/week): 0
Local Credits: 3
ECTS: 5
Prerequisites: None
Course Language: English
Course Type: Elective
Course Level: First Cycle
Mode of Delivery: -
Teaching Methods and Techniques of the Course: -
Course Coordinator: -
Course Lecturer(s): -
Assistant(s): -
Course Objectives: This course will introduce the fundamental principles and algorithms of Artificial Neural Network (ANN) systems. The course will cover many subjects, including the basic neuron model, the simple perceptron, the adaptive linear element, the Least Mean Square (LMS) algorithm, the Multilayer Perceptron (MLP), the Backpropagation (BP) learning algorithm, Radial Basis Function (RBF) networks, Self-Organizing Maps (SOM) and Learning Vector Quantization (LVQ), Support Vector Machines (SVMs), continuous-time and discrete-time Hopfield networks, classification techniques, pattern recognition, and signal processing and control applications.
Learning Outcomes: The students who succeed in this course will be able to:
  • Describe basic artificial neural network models,
  • Use the most common ANN architectures and their learning algorithms for a specific application,
  • Explain the principles of supervised and unsupervised learning, and generalization ability,
  • Evaluate the practical considerations in applying ANNs to real classification, pattern recognition, signal processing and control problems,
  • Implement basic ANN models and algorithms using MATLAB and its Neural Network Toolbox.
Course Description: The following topics will be included in the course: the main neural network architectures and learning algorithms, perceptrons and the LMS algorithm, backpropagation learning, radial basis function networks, support vector machines, Kohonen’s self-organizing feature maps, Hopfield networks, and artificial neural networks for signal processing, pattern recognition and control. (A minimal illustrative sketch of the LMS rule is given below.)
Related Sustainable Development Goals: -
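
As a brief illustration of the LMS rule mentioned in the course description above, the following is a minimal sketch of the Widrow-Hoff (LMS) update for a single adaptive linear element, written in Python with NumPy. The synthetic data, learning rate and number of epochs are illustrative assumptions; the course itself uses MATLAB and its Neural Network Toolbox for implementations.

# Minimal sketch: LMS (Widrow-Hoff) learning for a single adaptive linear
# element on a synthetic linear regression task. All data and hyperparameters
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))        # 200 two-dimensional inputs
true_w, true_b = np.array([1.5, -0.7]), 0.3
d = X @ true_w + true_b                          # desired (teacher) outputs

w = np.zeros(2)                                  # adaptive linear element weights
b = 0.0
eta = 0.05                                       # learning rate

for epoch in range(50):                          # pattern-mode gradient descent
    for x, target in zip(X, d):
        y = w @ x + b                            # linear neuron output
        e = target - y                           # output error
        w += eta * e * x                         # LMS (delta-rule) weight update
        b += eta * e

print("learned weights:", w, "learned bias:", b) # approaches [1.5, -0.7] and 0.3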

 



Course Category

Core Courses:
Major Area Courses:
Supportive Courses: X
Media and Management Skills Courses:
Transferable Skill Courses:

 

WEEKLY SUBJECTS AND RELATED PREPARATION STUDIES

Week Subjects Required Materials
1 Biological motivation. Historical remarks on artificial neural networks. Applications of artificial neural networks. A taxonomy of artificial neural network models and learning algorithms. Introduction. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
2 General artificial neuron model. Discrete-valued perceptron model, threshold logic and their limitations. Discrete-time (dynamical) Hopfield networks. Hebb’s rule. Connection weight matrix as an outer product of memory patterns. (See the illustrative Hopfield sketch after the reading list below.) Chapter 1. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
3 Supervised learning. Perceptron learning algorithm. Adaptive linear element. Supervised learning as output error minimization problem. Gradient descent algorithm for minimization. Least mean square rule. Chapter 2. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
4 Single-layer, continuous-valued perceptron. Nonlinear (sigmoidal) activation function. Delta rule. Batch-mode and pattern-mode gradient descent algorithms. Convergence conditions for deterministic and stochastic gradient descent algorithms. Chapter 3. Chapter 4: Sections 4.1, 4.2, 4.16. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
5 Multilayer perceptron as a universal approximator. Function representation and approximation problems. Backpropagation learning. Local minima problem. Overtraining. Chapter 4: Sections 4.4, 4.5, 4.8, 4.10, 4.12. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
6 Midterm Exam I. Batch-mode and pattern-mode training. Training set versus test set. Overfitting problem. General practices for network training and testing. Signal processing and pattern recognition applications of multilayer perceptrons. Chapter 4: Sections 4.3, 4.10, 4.11, 4.13, 4.14, 4.15, 4.19, 4.20. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
7 Radial Basis Function (RBF) networks. Backpropagation learning for determining the linear weight, center and width parameters of RBF networks. Random selection of centers. Input versus input-output clustering for center and width determination. Regularization theory, mixture-of-Gaussians (conditional probability density function) model and neuro-fuzzy connections of RBF networks. Chapter 5. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
8 Support vector machines for classification. Kernel representations. Generalization ability. Vapnik-Chervonenkis dimension. Support vector regression. Comparison of different kernels, loss (error) functions and norms for (separating hyperplane) flatness. Chapter 6. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
9 Parametric versus nonparametric methods for data representation. Unsupervised learning as a vector quantization problem. Competitive networks. Winner-take-all network. Kohonen’s self-organizing feature map. Clustering. Chapter 9. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
10 Continuous-time Hopfield networks. Stability analysis of multiple equilibria of Hopfield networks. Hopfield networks for cost minimization: Lyapunov (energy) based design of Hopfield networks. Associative memory. Traveling salesman problem. Combinatorial optimization. Chapter 13: Sections 13.1, 13.2, 13.3, 13.4, 13.5, 13.6, 13.7. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
11 Midterm Exam II. Signal processing applications of artificial neural networks. Principal component analysis. Data compression and reduction. Image and 1D signal compression and transformation applications of artificial neural networks. Chapter 8. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761.
12 Pattern recognition applications of artificial neural networks. Artificial neural networks for feature extraction. Nonlinear feature mapping. Data fusion. Artificial neural networks as classifiers. Image and speech recognition applications. Sections 1.4, 1.5, 3.11, 4.7, 5.8, 6.7. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
13 Control applications of artificial neural networks. Artificial neural networks for system identification. Artificial neural networks as controllers. Inverse systems design. Direct and indirect control methods. Adaptive control applications. Chapter 15: Section 15.3. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
14 Implementation of artificial neural network models and associated learning algorithms for signal processing, pattern recognition and control in the MATLAB numerical software environment. (A minimal Python sketch of backpropagation training, given after the reading list below, illustrates this kind of implementation.) Lecture Notes.
15 Cumulative review of artificial neural network models, learning algorithms and their applications. S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761. Lecture Notes.
16 Review of the Semester  
Course Notes/Textbooks S. Haykin, Neural Networks and Learning Machines, Pearson Education, 3rd Ed., 2009, ISBN13 9780131293762 ISBN10 0131293761
Suggested Readings/Materials J. M. Zurada, Introduction to Artificial Neural Systems, West Publishing Company, 1992, ISBN 053495460X, 9780534954604.
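
The sketch below illustrates the Week 2 material on discrete-time Hopfield networks: a connection weight matrix is formed by Hebb’s outer-product rule from two stored bipolar patterns, and asynchronous threshold updates, which never increase the Lyapunov (energy) function, recall a stored pattern from a corrupted probe. It is a minimal illustration in Python with NumPy; the stored patterns and the number of update sweeps are assumptions made for the example, not course material.

# Minimal sketch: discrete-time Hopfield network as an associative memory.
# Weights are set by Hebb's outer-product rule; recall uses asynchronous
# threshold updates, which never increase the energy E(s) = -1/2 s^T W s.
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])     # illustrative bipolar memories
n = patterns.shape[1]

W = sum(np.outer(p, p) for p in patterns) / n    # Hebbian outer-product weights
np.fill_diagonal(W, 0.0)                         # no self-connections

def energy(s):
    return -0.5 * s @ W @ s                      # Lyapunov (energy) function

state = patterns[0].copy()
state[0] *= -1                                   # corrupt one bit of pattern 0
print("initial energy:", energy(state))

for _ in range(3):                               # a few asynchronous sweeps
    for i in range(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

print("recalled state:", state)                  # matches stored pattern 0
print("final energy:", energy(state))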
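
Along the same lines, the following is a minimal sketch of the pattern-mode backpropagation training covered in Weeks 4-6 and referenced from the Week 14 implementation work: a one-hidden-layer multilayer perceptron with sigmoidal activations trained on the XOR problem. The course itself uses MATLAB and its Neural Network Toolbox; this sketch is written in Python with NumPy purely as an illustration, and the architecture, learning rate and epoch count are assumptions chosen for the example.

# Minimal sketch: pattern-mode backpropagation for a one-hidden-layer MLP
# with sigmoidal activations, trained on XOR. Architecture and hyperparameters
# are illustrative choices.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([[0.0], [1.0], [1.0], [0.0]])       # XOR target outputs

rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(2, 4))          # input-to-hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))          # hidden-to-output weights
b2 = np.zeros(1)
eta = 0.5                                        # learning rate

for epoch in range(5000):                        # pattern-mode (online) training
    for x, target in zip(X, d):
        h = sigmoid(x @ W1 + b1)                 # forward pass: hidden layer
        y = sigmoid(h @ W2 + b2)                 # forward pass: output layer
        delta_out = (y - target) * y * (1 - y)   # output-layer local gradient
        delta_hid = (delta_out @ W2.T) * h * (1 - h)  # backpropagated gradient
        W2 -= eta * np.outer(h, delta_out)       # gradient-descent updates
        b2 -= eta * delta_out
        W1 -= eta * np.outer(x, delta_hid)
        b1 -= eta * delta_hid

y_all = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(y_all.round(2))                            # typically close to [0, 1, 1, 0]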

 

EVALUATION SYSTEM

Semester Activities (Number / Weighting):
Participation:
Laboratory / Application:
Field Work:
Quizzes / Studio Critiques:
Portfolio:
Homework / Assignments: 5 / 20
Presentation / Jury:
Project: 1 / 30
Seminar / Workshop:
Oral Exam:
Midterm: 2 / 50
Final Exam:
Total: 8 / 100
Weighting of Semester Activities on the Final Grade: 100
Weighting of End-of-Semester Activities on the Final Grade:
Total: 100

ECTS / WORKLOAD TABLE

Semester Activities (Number / Duration (Hours) / Workload):
Course Hours (Including exam week: 16 x total hours): 16 / 3 / 48
Laboratory / Application Hours (Including exam week: 16 x total hours): 16
Study Hours Out of Class: 15 / 3 / 45
Field Work:
Quizzes / Studio Critiques:
Portfolio:
Homework / Assignments: 5 / 3 / 15
Presentation / Jury:
Project: 1 / 24 / 24
Seminar / Workshop:
Oral Exam:
Midterms: 2 / 9 / 18
Final Exams:
Total: 150

 

COURSE LEARNING OUTCOMES AND PROGRAM QUALIFICATIONS RELATIONSHIP

#	Program Competencies/Outcomes	Contribution Level*
1	To have adequate knowledge in Mathematics, Science and Computer Engineering; to be able to use theoretical and applied information in these areas on complex engineering problems.	X
2	To be able to identify, define, formulate, and solve complex Computer Engineering problems; to be able to select and apply proper analysis and modeling methods for this purpose.	X
3	To be able to design a complex system, process, device or product under realistic constraints and conditions, in such a way as to meet the requirements; to be able to apply modern design methods for this purpose.
4	To be able to devise, select, and use modern techniques and tools needed for analysis and solution of complex problems in Computer Engineering applications; to be able to use information technologies effectively.	X
5	To be able to design and conduct experiments, gather data, analyze and interpret results for investigating complex engineering problems or Computer Engineering research topics.	X
6	To be able to work efficiently in Computer Engineering disciplinary and multi-disciplinary teams; to be able to work individually.
7	To be able to communicate effectively in Turkish, both orally and in writing; to be able to author and comprehend written reports, to be able to prepare design and implementation reports, to present effectively, to be able to give and receive clear and comprehensible instructions.
8	To have knowledge about global and social impact of Computer Engineering practices on health, environment, and safety; to have knowledge about contemporary issues as they pertain to engineering; to be aware of the legal ramifications of Computer Engineering solutions.
9	To be aware of ethical behavior, professional and ethical responsibility; to have knowledge about standards utilized in engineering applications.
10	To have knowledge about industrial practices such as project management, risk management, and change management; to have awareness of entrepreneurship and innovation; to have knowledge about sustainable development.
11	To be able to collect data in the area of Computer Engineering, and to be able to communicate with colleagues in a foreign language. ("European Language Portfolio Global Scale", Level B1)
12	To be able to speak a second foreign language at a medium level of fluency efficiently.
13	To recognize the need for lifelong learning; to be able to access information, to be able to stay current with developments in science and technology; to be able to relate the knowledge accumulated throughout the human history to Computer Engineering.

*1 Lowest, 2 Low, 3 Average, 4 High, 5 Highest